Microsoft announced the next generation of its artificial intelligence chip, a potential alternative to leading processors from Nvidia and to offerings from cloud rivals Amazon and Google.
The Maia 200 comes two years after Microsoft said it had developed its first AI chip, the Maia 100, which was never made available for cloud clients to rent. Scott Guthrie, Microsoft's executive vice president for cloud and AI, said in a blog post Monday that, for the new chip, there will be "wider customer availability in the future."
Guthrie called the Maia 200 "the most efficient inference system Microsoft has ever deployed." Developers, academics, AI labs and people contributing to open-source AI models can apply for a preview of a software development kit.
Microsoft said its superintelligence team, led by Mustafa Suleyman, will use the new chip. The Microsoft 365 Copilot add-on for commercial productivity software bundles and the Microsoft Foundry service, for building on top of AI models, will use it as well.
Cloud providers face surging demand from generative AI model developers such as Anthropic and OpenAI and from companies building AI agents and other products on top of the popular models. Data center operators and infrastructure providers are trying to increase their computing capacity while keeping power consumption in check.
Microsoft is outfitting its U.S. Central region of data centers with Maia 200 chips, and they'll arrive at the U.S. West 3 region after that, with additional locations to follow.
The chips use Taiwan Semiconductor Manufacturing Co.'s 3-nanometer process, and four are connected inside each server. They communicate over Ethernet rather than the InfiniBand standard; Nvidia sells InfiniBand switches following its 2020 Mellanox acquisition.
The chip offers 30% higher performance than alternatives for the same price, Guthrie wrote. Microsoft said each Maia 200 packs more high-bandwidth memory than Amazon Web Services' third-generation Trainium AI chip or Google's seventh-generation tensor processing unit.
Microsoft can achieve high performance by wiring up to 6,144 of the Maia 200 chips together, reducing energy usage and total cost of ownership, Guthrie wrote.
In 2023, Microsoft demonstrated that its GitHub Copilot coding assistant could run on Maia 100 processors.